Toward Crowdsourced User Studies for Software Evaluation
Abstract
This work-in-progress paper describes a vision: that of fast and reliable software user experience studies conducted with the help of the crowd. User studies are commonly controlled in-lab activities that require the instruction, monitoring, interviewing, and compensation of a number of participants who are typically hard to recruit. The goal of this work is to study which user study methods can instead be crowdsourced to generic audiences, enabling user studies without the need for expensive lab experiments. The challenge is understanding how to conduct crowdsourced studies without giving up too many of the guarantees that in-lab settings are able to provide.

User studies are experimental and observational research methods for measuring an artifact’s properties as perceived by its users (we specifically focus on software artifacts, such as Web applications). They are, for instance, used to evaluate the strengths and weaknesses of different visualization techniques, to understand whether theoretical principles hold in practical settings, to measure whether requirements are met by a given software design, or to validate and test usability. Over the last decades, user studies have increasingly found their way into software engineering practice, and today it is almost impossible to find successful applications that do not consider the perception of their users. Facebook and Google, for example, can rely on an unprecedented user base to test new features on the fly and to adjust them according to observed performance or preferences. The problem is that not everybody has access to such a user base, e.g., because their own application has only a small target user group or because the application is still under development. Streamlining the necessary methods, involving the crowd, and providing user study support as a service while keeping study outputs reliable can thus make user studies significantly more accessible, to the benefit of everybody.
The focus of this work is on how to crowdsource different user study methods conceptually and technically, i.e., on how to design effective tasks for user studies, gather and analyze data, guarantee quality, and achieve representativeness. The question of which method suits which research question is outside its scope.
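To make the quality-guarantee point concrete, a minimal sketch of one standard crowdsourcing technique follows: filtering workers with gold-standard questions (tasks whose correct answer is known in advance) and aggregating the remaining answers by majority vote. The worker IDs, task IDs, and answers are hypothetical illustrations, not data or an implementation from the paper.

```python
from collections import Counter

# Known-answer control tasks ("gold" questions) mixed into the task set.
GOLD = {"g1": "yes", "g2": "no"}

# Hypothetical crowd responses: worker -> answers to gold and real tasks.
responses = {
    "w1": {"g1": "yes", "g2": "no", "t1": "A", "t2": "B"},
    "w2": {"g1": "yes", "g2": "no", "t1": "A", "t2": "B"},
    "w3": {"g1": "no",  "g2": "no", "t1": "B", "t2": "A"},  # fails gold check
}

def passes_gold(answers, gold, threshold=1.0):
    """Keep a worker only if their accuracy on gold tasks meets the threshold."""
    correct = sum(answers.get(task) == ans for task, ans in gold.items())
    return correct / len(gold) >= threshold

def majority_vote(responses, gold):
    """Aggregate the answers of gold-passing workers by simple majority."""
    trusted = [a for a in responses.values() if passes_gold(a, gold)]
    tasks = {t for a in trusted for t in a if t not in gold}
    return {t: Counter(a[t] for a in trusted if t in a).most_common(1)[0][0]
            for t in tasks}

print(majority_vote(responses, GOLD))  # w3's answers are filtered out
```

In practice the accuracy threshold is a design choice: a strict threshold discards more noise but also more honest workers, which in turn affects how representative the remaining sample is.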
Similar Resources
A Text Analyser of Crowdsourced Online Sources for Knowledge Discovery
In the last few years, Twitter has become the centre of crowdsourced-generated content. Numerous tools exist to analyse its content to lead to knowledge discovery. However, most of them focus solely on the content and ignore user features. Selecting and analysing user features such as user activity and relationships lead to the discovery of authorities and user communities. Such a discovery can...
Soylent: A Word Processor with a Crowd Inside
This paper introduces architectural and interaction patterns for integrating crowdsourced human contributions directly into user interfaces. We focus on writing and editing, complex endeavors that span many levels of conceptual and pragmatic activity. Authoring tools offer help with pragmatics, but for higher-level help, writers commonly turn to other people. We thus present Soylent, a word pro...
A Design, Analysis and Evaluation Model to Support the Visualization Designer-User
Existing visualization design and evaluation frameworks rest on a distinction between the designer and the user. However, there is little explicit guidance on design, analysis and evaluation when the designer is the user. A simple solution to this problem is for the researcher (who combines the designer and user roles) to be clear about which activity they are conducting at which point in time....
Crowdsourced Web Site Evaluation with CrowdStudy
Many different automatic usability evaluation tools have been specifically developed for web sites and web-based services, but they usually cannot replace user testing. At the same time, traditional usability evaluation methods can be both expensive and time consuming. We will demonstrate CrowdStudy, a toolkit for crowdsourced testing of web interfaces that allows, not only to efficiently recru...
Can Laymen Outperform Experts? The Effects of User Expertise and Task Design in Crowdsourced Software Testing
In recent years, crowdsourcing has increasingly gained attention as a powerful sourcing mechanism for problem-solving in organizations. Depending on the type of activity addressed by crowdsourcing, the complexity of the tasks and the role of the crowdworkers may differ substantially. It is crucial that the tasks are designed and allocated according to the capabilities of the targeted crowds. In...
Journal: CoRR
Volume: abs/1609.01070
Pages: -
Publication date: 2016